Showing posts with label Health Issues.

4/25/2013

Gut bacteria linked to obesity

Researchers at the University of Maryland School of Medicine have identified 26 species of bacteria in the human gut microbiota that appear to be linked to obesity and related metabolic complications. These include insulin resistance, high blood sugar levels, increased blood pressure and high cholesterol, known collectively as "the metabolic syndrome," which significantly increases an individual’s risk of developing diabetes, cardiovascular disease and stroke.

"We identified 26 species of bacteria that were correlated with obesity and metabolic syndrome traits such as body mass index (BMI), triglycerides, cholesterol, glucose levels and C-reactive protein, a marker for inflammation," says the senior author, Claire M. Fraser, Ph.D., professor of medicine and microbiology and immunology and director of the Institute for Genome Sciences (IGS) at the University of Maryland School of Medicine. "We can’t infer cause and effect, but it’s an important step forward that we're starting to identify bacteria that are correlated with clinical parameters, which suggests that the gut microbiota could one day be targeted with medication, diet or lifestyle changes." The results of the study, which analyzed data from the Old Order Amish in Lancaster County, Pa., are being published online on Aug. 15, 2012, in PLOS ONE, a journal of the Public Library of Science. The study was funded by the National Institutes of Health (NIH) under grants UH2/UH3 DK083982, U01 GM074518 and P30 DK072488.
Dr. Fraser says that additional research, including an interventional study with the Amish, is essential. "We can look at whether these bacteria change over time in a given individual or in response to diet or medication," she says.
Dr. Fraser notes that the research team, led by Margaret L. Zupancic, Ph.D., then a postdoctoral fellow at IGS, also found an apparent link between the gut bacteria and inflammation, which is believed to be a factor in obesity and many other chronic diseases. "This is one of the first studies of obesity in humans to make a link between inflammatory processes and specific organisms that are present in the GI tract," Dr. Fraser says, noting that participants with metabolic syndrome who had elevated serum markers associated with inflammation tended to have the lowest levels of good bacteria that have been reported previously to have anti-inflammatory properties.
The study is the result of an ongoing collaboration between Dr. Fraser and Alan R. Shuldiner, M.D., in connection with the NIH’s Human Microbiome Project, which seeks to characterize microbial communities in the body. Dr. Shuldiner, associate dean for personalized medicine and director of the Program in Personalized and Genomic Medicine at the University of Maryland School of Medicine, operates an Amish research clinic in Lancaster, Pa. Over the past 20 years, he and his research team have conducted more than a dozen studies with the Amish, looking for genes that may cause common diseases, such as diabetes, osteoporosis and cardiovascular disease.
"The Old Order Amish are ideal for such studies because they are a genetically homogenous population descended from a few founder families and have a similar rural lifestyle," Dr. Shuldiner, the John L. Whitehurst Professor of Medicine, says. "We believe the results of this study are relevant to a broader population because the clinical characteristics of obesity and its complications in the Amish are no different from the general Caucasian population," he says.
E. Albert Reece, M.D., Ph.D., M.B.A., vice president for medical affairs at the University of Maryland and the John Z. and Akiko K. Bowers Distinguished Professor and dean of the University of Maryland School of Medicine, says, "Obesity and its related complications have become a critical public health concern, and the number of people who are now considered obese or overweight has skyrocketed. Dr. Fraser and Dr. Shuldiner are two of our most senior research-scientists and leaders in their respective fields. This study provides valuable insights into the role the bacteria in our bodies may play in obesity and the metabolic syndrome. We may ultimately be able to target the gut microbiome to help prevent or mitigate risk factors for a number of diseases."
The researchers analyzed the bacteria in fecal samples of 310 members of the Old Order Amish community, using a process that enables them to identify a marker gene that serves as a bar code for each type of bacteria. Participants in the study ranged from lean to overweight to obese; some of the obese participants also had features of the metabolic syndrome. "Our hypothesis was that we would see a different composition in the gut microbiota in lean vs. obese individuals and possibly in individuals who were obese but also had features of the metabolic syndrome," Dr. Fraser says.
They discovered that every individual possessed one of three different communities of interacting bacteria, each characterized by a dominant bacterial genus. Neither BMI nor any metabolic syndrome trait was specifically associated with any of these communities. Instead, differing levels of 26 less abundant bacterial species present in all individuals appeared to be linked to obesity and certain features of the metabolic syndrome.
Interestingly, the researchers also analyzed people's gut bacteria by occupation and found that those who had regular contact with livestock, such as farmers and their wives, had bacterial communities dominated by Prevotella, a type of bacteria that is also abundant in the gut microbiota of cattle and sheep. "These findings suggest that environmental exposure may play a role in determining the composition of the gut microbiota in humans," Dr. Fraser says.

University of Maryland Medical Center (2012, August 15). Gut bacteria linked to obesity and metabolic syndrome identified. ScienceDaily. Retrieved April 26, 2013, from http://www.sciencedaily.com/releases/2012/08/120815174902.htm

4/24/2013

Scientists Can Now Block Heroin, Morphine Addiction

In a major breakthrough, an international team of scientists has proven that addiction to morphine and heroin can be blocked, while at the same time increasing pain relief.



Laboratory studies have shown that the drug (+)-naloxone (pronounced: PLUS nal-OX-own) will selectively block the immune-addiction response. The team from the University of Adelaide and University of Colorado has discovered the key mechanism in the body's immune system that amplifies addiction to opioid drugs.
The results -- which could eventually lead to new co-formulated drugs that assist patients with severe pain, as well as help heroin users to kick the habit -- will be published August 16 in the Journal of Neuroscience.
"Our studies have shown conclusively that we can block addiction via the immune system of the brain, without targeting the brain's wiring," says the lead author of the study, Dr Mark Hutchinson, ARC Research Fellow in the University of Adelaide's School of Medical Sciences.
"Both the central nervous system and the immune system play important roles in creating addiction, but our studies have shown we only need to block the immune response in the brain to prevent cravings for opioid drugs."
The team has focused its research efforts on the immune receptor known as Toll-Like receptor 4 (TLR4).
"Opioid drugs such as morphine and heroin bind to TLR4 in a similar way to the normal immune response to bacteria. The problem is that TLR4 then acts as an amplifier for addiction," Dr Hutchinson says.
"The drug (+)-naloxone automatically shuts down the addiction. It shuts down the need to take opioids, it cuts out behaviours associated with addiction, and the neurochemistry in the brain changes -- dopamine, which is the chemical important for providing that sense of 'reward' from the drug, is no longer produced."
Senior author Professor Linda Watkins, from the Center for Neuroscience at the University of Colorado Boulder, says: "This work fundamentally changes what we understand about opioids, reward and addiction. We've suspected for some years that TLR4 may be the key to blocking opioid addiction, but now we have the proof.
"The drug that we've used to block addiction, (+)-naloxone, is a non-opioid mirror image drug that was created by Dr Kenner Rice in the 1970s. We believe this will prove extremely useful as a co-formulated drug with morphine, so that patients who require relief for severe pain will not become addicted but still receive pain relief. This has the potential to lead to major advances in patient and palliative care," Professor Watkins says.
The researchers say clinical trials may be possible within the next 18 months.
This study has been funded by the National Institute on Drug Abuse (NIDA) in the United States and the Australian Research Council (ARC).
Source: University of Adelaide (2012, August 14). Scientists can now block heroin, morphine addiction. ScienceDaily. Retrieved April 24, 2013, from http://www.sciencedaily.com/releases/2012/08/120814213246.htm

2/21/2012

Cocaine and the Teen Brain: New Insights Into Addiction


When first exposed to cocaine, the adolescent brain launches a strong defensive reaction designed to minimize the drug's effects, Yale and other scientists have found. Now two new studies by a Yale team identify key genes that regulate this response and show that interfering with this reaction dramatically increases a mouse's sensitivity to cocaine. 


The findings may help explain why the risk of drug abuse and addiction increases so dramatically when cocaine use begins during the teenage years. 

The results were published in the Feb. 14 and Feb. 21 issues of the Journal of Neuroscience. 

Researchers, including teams at Yale, have shown that vulnerability to cocaine is much higher in adolescence, when the brain is shifting from an explosive and plastic growth phase to the more settled and refined neural connections characteristic of adults. Past studies at Yale have shown that neurons and their synaptic connections in adolescence change shape when first exposed to cocaine through a molecular pathway regulated by the gene integrin beta1, which is crucial to the development of the nervous system of vertebrates. 

"This suggests that these structural changes observed are probably protective of the neurocircuitry, an effort of the neuron to protect itself when first exposed to cocaine," said Anthony Koleske, professor of molecular biophysics and biochemistry and of neurobiology and senior author of both papers. 

In the latest study, Yale researchers report that when they knocked out this pathway, mice needed approximately three times less cocaine to induce behavioral changes than mice with an intact pathway. 

The research suggests that the relative strength of the integrin beta1 pathway among individuals may explain why some cocaine users end up addicted to the drug while others escape its worst effects, Koleske theorized. 

"If you were to become totally desensitized to cocaine, there is no reason to seek the drug," he said. 

Koleske and Jane R. Taylor, professor of psychiatry and psychology and an author of the Feb. 14 paper, are teaming up with other Yale researchers to look for other genes that may play a role in protecting the brain from effects of cocaine and other drugs of abuse. 

Shannon Gourley, now of Emory University, who worked with Koleske and Taylor, is lead author on the Feb. 14 paper detailing how the structural response to cocaine protects against cocaine sensitivity. Anastasia Oleveska and Michael S. Warren are other Yale authors on this paper. Warren and William D. Bradley of Yale are co-lead authors of the latest Journal of Neuroscience paper describing the role for integrin beta1 in the control of adolescent synapse and dendrite refinement and stability. Yu-Chih Lin, Mark A. Simpson, and Charles A. Greer are other Yale-affiliated authors. 

Author: Bill Hathaway | Source: Yale University [February 21, 2012]

2/04/2012

Stressed kids more likely to become obese


The more ongoing stress children are exposed to, the greater the odds they will become obese by adolescence, reports Cornell environmental psychologist Gary Evans in the journal Pediatrics (129:1). 


Nine-year-old children who were chronically exposed to such stressors as poverty, crowded housing and family turmoil gained more weight and were significantly heavier by age 13 than they would have been otherwise, the study found. The reason, Evans and his co-authors suggest, is that ongoing stress makes it tougher for children to control their behavior and emotions -- or self-regulate. That, in turn, can lead to obesity by their teen years. 

"These children are heavier, and they gain weight faster as they grow up. A very good predictor of adults' ability to follow healthy habits is their ability to self-regulate. It seems reasonable that the origins of that are probably in childhood. This [research] is starting to lay that out," said Evans, the Elizabeth Lee Vincent Professor of Human Ecology in the Departments of Design and Environmental Analysis and of Human Development in Cornell's College of Human Ecology. 

Evans conducted the study with former students Thomas Fuller-Rowell, Ph.D. '10, now a Robert Wood Johnson postdoctoral fellow at the University of Wisconsin-Madison, and Stacey Doan, Ph.D. '10, an assistant professor of psychology at Boston University. 

The researchers measured the height and weight of 244 9-year-olds in rural New York state and assessed their exposure to various physical and psychosocial stressors -- for example, exposure to violence, living in a substandard house or having no access to such resources as books. They also measured the children's ability to delay gratification by offering them a choice between waiting for a large plate of candy versus having a medium plate immediately. The researchers measured the children's height and weight again four years later. 

While the study doesn't prove that a child's inability to delay gratification causes her to gain weight, there's strong evidence to suggest that it does, Evans said. First, previous studies have shown that chronic stress is linked to weight gain in children and teenagers, and that children eat more sugary, fatty foods when stressed. 

Second, there is a plausible neurocognitive mechanism that may help explain this behavior, Evans said. "There's some evidence that parts of the brain that are vulnerable and sensitive to stress, particularly early in life, are some of the same parts involved in this self-regulatory behavior." 

The study has implications for education policies such as No Child Left Behind that emphasize testing cognitive abilities but ignore children's ability to control their behavior and emotions, Evans said. 

"A child's ability to self-regulate is not just predictive of things like whether you're going to have trouble with weight -- it predicts grades, graduating from high school. A 4-year-old's ability to self-regulate even predicts SAT scores. This is a very powerful phenomenon," he said. 

The findings also have implications for interventions and policies aimed at reducing individual stressors. "If it's the cumulative impact of stress on these families that is important, that means an intervention that only looks at one stressor -- say, just drug abuse, which is how most interventions are designed -- is doomed to fail," Evans concluded. 

Author: Susan Kelley | Source: Cornell University [January 21, 2012]

2/01/2012

Why the brain is more reluctant to function as we age


New findings, led by neuroscientists at the University of Bristol and published this week in the journal Neurobiology of Aging, reveal a novel mechanism through which the brain may become more reluctant to function as we grow older. 


It is not fully understood why the brain's cognitive functions such as memory and speech decline as we age, although work published this year suggests cognitive decline can be detectable before 50 years of age. The research, led by Professor Andy Randall and Dr Jon Brown from the University's School of Physiology and Pharmacology, identified a novel cellular mechanism underpinning changes to the activity of neurones which may underlie cognitive decline during normal healthy aging. 

The brain largely uses electrical signals to encode and convey information. Modifications to this electrical activity are likely to underpin age-dependent changes to cognitive abilities. 

The researchers examined the brain's electrical activity by making recordings of electrical signals in single cells of the hippocampus, a structure with a crucial role in cognitive function. In this way they characterised what is known as "neuronal excitability" — this is a descriptor of how easy it is to produce brief, but very large, electrical signals called action potentials; these occur in practically all nerve cells and are absolutely essential for communication within all the circuits of the nervous system. 

Action potentials are triggered near the neurone's cell body and once produced travel rapidly through the massively branching structure of the nerve cell, along the way activating the synapses the nerve cell makes with the numerous other nerve cells to which it is connected. 

The Bristol group identified that in the aged brain it is more difficult to make hippocampal neurones generate action potentials. Furthermore they demonstrated that this relative reluctance to produce action potentials arises from changes to the activation properties of membrane proteins called sodium channels, which mediate the rapid upstroke of the action potential by allowing a flow of sodium ions into neurones. 

Professor Randall, Professor in Applied Neurophysiology, said: "Much of our work is about understanding dysfunctional electrical signalling in the diseased brain, in particular Alzheimer's disease. We began to question, however, why even the healthy brain can slow down once you reach my age. Previous investigations elsewhere have described age-related changes in processes that are triggered by action potentials, but our findings are significant because they show that generating the action potential in the first place is harder work in aged brain cells. 

"Also by identifying sodium channels as the likely culprit for this reluctance to produce action potentials, our work even points to ways in which we might be able to modify age-related changes to neuronal excitability, and by inference cognitive ability." 

Source: University of Bristol [February 01, 2012]

1/05/2012

Sexual satisfaction in women increases with age


A new study of sexually active older women has found that sexual satisfaction in women increases with age and that even those not engaging in sex are satisfied with their sex lives. A majority of study participants report frequent arousal and orgasm that continue into old age, despite low sexual desire. The study appears in the January issue of the American Journal of Medicine. 


Researchers from the University of California, San Diego School of Medicine and the Veterans Affairs San Diego Healthcare System evaluated sexual activity and satisfaction as reported by 806 older women who are part of the Rancho Bernardo Study (RBS) cohort, a group of women who live in a planned community near San Diego and whose health has been tracked for medical research for 40 years. The study measured the prevalence of current sexual activity; the characteristics associated with sexual activity including demographics, health, and hormone use; frequency of arousal, lubrication, orgasm, and pain during sexual intercourse; and sexual desire and satisfaction in older women. 

The median age in the study was 67 years and 63% were postmenopausal. Half the respondents who reported having a partner had been sexually active in the last 4 weeks. The likelihood of sexual activity declined with increasing age. The majority of the sexually active women, 67.1%, achieved orgasm most of the time or always. The youngest and oldest women in the study reported the highest frequency of orgasm satisfaction. 

Forty percent of all women stated that they never or almost never felt sexual desire, and one third of the sexually active women reported low sexual desire. Lead investigator Elizabeth Barrett-Connor, MD, Distinguished Professor and Chief, Division of Epidemiology, Department of Family and Preventive Medicine, University of California, San Diego School of Medicine, comments, "Despite a correlation between sexual desire and other sexual function domains, only 1 in 5 sexually active women reported high sexual desire. Approximately half of the women aged 80 years or more reported arousal, lubrication, and orgasm most of the time, but rarely reported sexual desire. In contrast with the traditional linear model in which desire precedes sex, these results suggest that women engage in sexual activity for multiple reasons, which may include affirmation or sustenance of a relationship." 

Regardless of partner status or sexual activity, 61% of all women in this cohort were satisfied with their overall sex life. Although older age has been described as a significant predictor of low sexual satisfaction, the percentage of RBS sexually satisfied women actually increased with age, with approximately half of the women over 80 years old reporting sexual satisfaction almost always or always. Not only were the oldest women in this study the most satisfied overall, those who were recently sexually active experienced orgasm satisfaction rates similar to the youngest participants. "In this study, sexual activity was not always necessary for sexual satisfaction. Those who were not sexually active may have achieved sexual satisfaction through touching, caressing, or other intimacies developed over the course of a long relationship," says first author Susan Trompeter, MD, Associate Clinical Professor of Medicine, Division of General Internal Medicine, Department of Medicine at the University of California, San Diego School of Medicine and Staff Physician at the VA San Diego Healthcare System. 

"Emotional and physical closeness to the partner may be more important than experiencing orgasm. A more positive approach to female sexual health focusing on sexual satisfaction may be more beneficial to women than a focus limited to female sexual activity or dysfunction," Trompeter concludes.  

Source: Elsevier Health Sciences via EurekAlert! [January 03, 2012]

12/20/2011

How pregnancy changes a woman's brain


We know a lot about the links between a pregnant mother’s health, behavior, and moods and her baby’s cognitive and psychological development once it is born. But how does pregnancy change a mother’s brain? “Pregnancy is a critical period for central nervous system development in mothers,” says psychologist Laura M. Glynn of Chapman University. 


“Yet we know virtually nothing about it.” Glynn and her colleague Curt A. Sandman, of the University of California, Irvine, are doing something about that. Their review of the literature in Current Directions in Psychological Science, a journal published by the Association for Psychological Science, discusses the theories and findings that are starting to fill what Glynn calls “a significant gap in our understanding of this critical stage of most women’s lives.” 

At no other time in a woman’s life does she experience such massive hormonal fluctuations as during pregnancy. Research suggests that the reproductive hormones may ready a woman’s brain for the demands of motherhood—helping her become less rattled by stress and more attuned to her baby’s needs. Although the hypothesis remains untested, Glynn surmises this might be why moms wake up when the baby stirs while dads snore on. Other studies confirm the truth in a common complaint of pregnant women: “Mommy Brain,” or impaired memory before and after birth. “There may be a cost” of these reproduction-related cognitive and emotional changes, says Glynn, “but the benefit is a more sensitive, effective mother.” 

The article reviews research that refines earlier findings on the effects of the prenatal environment on the baby. For instance, evidence is accumulating to show that it’s not prenatal adversity on its own—say, maternal malnourishment or depression—that presents risks for a baby. Congruity between life in utero and life on the outside may matter more. A fetus whose mother is malnourished adapts to scarcity and will cope better with a dearth of food once it’s born—but could become obese if it eats normally. Timing is critical too: maternal anxiety early in gestation takes a toll on the baby’s cognitive development; the same high levels of stress hormones late in pregnancy enhance it. 

Just as Mom permanently affects her fetus, new science suggests that the fetus does the same for Mom. Fetal movement, even when the mother is unaware of it, raises her heart rate and her skin conductivity, signals of emotion—and perhaps of pre-natal preparation for mother-child bonding. Fetal cells pass through the placenta into the mother’s bloodstream. “It’s exciting to think about whether those cells are attracted to certain regions in the brain” that may be involved in optimizing maternal behavior, says Glynn. 

Glynn cautions that most research on the maternal brain has been conducted with rodents, whose pregnancies differ enormously from women’s; more research on human mothers is needed. But she is optimistic that a more comprehensive picture of the persisting brain changes wrought by pregnancy will yield interventions to help at-risk mothers do better by their babies and themselves. 

Source: Association for Psychological Science [December 20, 2011]

12/18/2011

New test to indicate likely spread or recurrence of breast cancer


A Queensland University of Technology (QUT) PhD student has developed a potential breakthrough test for predicting the likelihood of the spread or return of breast cancer. 


"While in recent years there have been fantastic advances in the treatment of breast cancer there has been no way of predicting its progress," said Helen McCosker, a PhD student at the Institute of Health and Biomedical Innovation (IHBI). 

Ms McCosker's research found that a breast cancer's interaction with its surrounding environment held the key to predicting whether it would grow, become dormant or spread to other organs. 

"The ability to predict its progress is a huge step forward as it will ultimately enable doctors to select the most appropriate treatments for individual patients," she said. 

"This test should identify those patients who need their cancer removed but require no further treatment, those who need the tumour removed but also require additional treatment, for example, chemotherapy, and those who need more vigorous treatments. 

"That will mean that patients should neither receive unnecessary treatments nor be undertreated when a more aggressive medical response is required." 

Ms McCosker said the new test would use the tissue surrounding the cancer cells, which is collected during biopsies but currently not examined. 

"The test makes better use of tissue that's already being collected anyway, so from the patient's point of view there would be no change; no new test," she said. 

She said the next step was to develop an easy-to-use, accurate online program that doctors would use to predict cancer progression. 

"Ultimately, doctors should be able to key the results of the examination of tissue samples into an online program with built-in mathematical models and be presented with a clear answer as to the likelihood of cancer progression." 

She said the test would offer solutions for a wide range of patients, particularly those with more advanced, aggressive disease that could spread to other organs, as well as those in rural and remote areas with limited access to advanced medical services. 

"The next step is to seek financial backing to fine-tune and commercialise the current prototype. It's expected our models will be trialled in pathology laboratories over the coming years and if successful rolled out over the next five to 10 years," Ms McCosker said. 

Ms McCosker said the test, which is being funded by the Wesley Research Institute, should ultimately be applicable to other forms of cancer. 

She said breast cancer accounted for 28 per cent of diagnosed cancers in Australian women and 16 per cent of cancer associated deaths. 

Source: Queensland University of Technology [December 13, 2011]

Study finds link between air pollution and increase in DNA damage


A study in the Czech Republic has found a link between exposure to certain air pollutants and an increase in DNA damage for people exposed to high levels of the pollution. 


They found that breathing small quantities of a polycyclic aromatic hydrocarbon (PAH), called benzo[a]pyrene (B[a]P), caused an increase in the number of certain 'biomarkers' in DNA associated with a higher risk of diseases, including cancer. 

Air pollution is a major problem around the world, particularly in urban areas. In an attempt to control regional air pollution levels, the EU has introduced legal limits for exposure to a variety of different airborne pollutants. For B[a]P, the EU air quality standard is 1 nanogram per cubic metre (ng/m3) as an annual average that has to be attained where possible throughout the EU. 

To measure the risk of DNA damage and risk to health caused by exposure to chemicals, such as PAHs, researchers sometimes use 'biomarkers' – these are biological features that can provide an indicative picture of risk and disease. 

Previous studies have suggested that 'DNA adducts' can be used as biomarkers to measure exposure to PAHs. These are, in effect, small molecules, such as PAHs, bound to the DNA. Similarly, 'chromosomal aberrations' - structural changes to a stretch of DNA - can be used as biomarkers to demonstrate the effect of some pollutants on DNA. 

To test whether there was a possible link between exposure to PAHs and the frequency of DNA adducts and chromosomal aberrations, the researchers, supported by the EU EnviRisk and INTARESE projects, examined DNA from 950 police officers and bus drivers in Prague. 

The participants, drawn from three separate studies conducted over a five-year period, all worked outdoors for more than eight hours a day. Each carried a device to measure their personal exposure to PAHs and DNA was extracted from the participants' white blood cells. 

The researchers also tested a new technique for identifying chromosomal aberrations called 'fluorescence in-situ hybridisation', or FISH, which is much more sensitive than previous techniques. 

The results revealed, for the first time, a significant relationship between exposure to PAHs, the number of DNA adducts and the number of chromosomal aberrations detected using FISH. In particular, PAH levels and the occurrence of the two biomarkers were higher in winter than in summer. 

In one of the studies, average personal exposure to B[a]P and PAHs in January was measured as 1.58 ng/m3 and 9.07 ng/m3, respectively. In June, this dropped to 0.18 ng/m3 and 1.92 ng/m3. 

The number of B[a]P-like DNA adducts and chromosomal aberrations were correspondingly much higher in January than in June. In fact, the number of DNA adducts strongly mirrored exposure to PAHs in the past 30 days. 

These findings are of concern because exposure to more than 1 ng/m3 of B[a]P has been found to put people at higher risk of developing cancer later in life. 

Previous studies have shown that DNA adducts can be an indicator for cancer several years after exposure, and the findings of this study indicate that DNA adduct biomarkers and chromosomal aberrations measured using FISH could help health authorities identify individuals at higher risk of disease.  

Source: Click Green [December 18, 2011]

12/03/2011

Vegetables, fruits, grains reduce stroke risk in women


Swedish women who ate an antioxidant-rich diet had fewer strokes regardless of whether they had a previous history of cardiovascular disease, in a study reported in Stroke: Journal of the American Heart Association. 


"Eating antioxidant-rich foods may reduce your risk of stroke by inhibiting oxidative stress and inflammation," said Susanne Rautiainen, M.Sc., the study's first author and Ph.D. student at the Karolinska Institutet in Sweden. "This means people should eat more foods such as fruits and vegetables that contribute to total antioxidant capacity." 

Oxidative stress is an imbalance between the production of cell-damaging free radicals and the body's ability to neutralize them. It leads to inflammation, blood vessel damage and stiffening. 

Antioxidants such as vitamins C and E, carotenoids and flavonoids can inhibit oxidative stress and inflammation by scavenging the free radicals. Antioxidants, especially flavonoids, may also help improve endothelial function and reduce blood clotting, blood pressure and inflammation. 

"In this study, we took into account all the antioxidants present in the diet, including thousands of compounds, in doses obtained from a usual diet," Rautiainen said. Researchers collected dietary data through a food-frequency questionnaire. They used a standard database to determine participants' total antioxidant capacity (TAC), which measures the free radical reducing capacity of all antioxidants in the diet and considers synergistic effects between substances. 

Researchers categorized the women according to their TAC levels — five groups without a history of cardiovascular disease and four with previous cardiovascular disease. 

For women with no history of cardiovascular disease who had the highest TAC, fruits and vegetables contributed about 50 percent of TAC. Other contributors were whole grains (18 percent), tea (16 percent) and chocolate (5 percent). 

The study found: 

  • Higher TAC was related to lower stroke rates in women without cardiovascular disease. 
  • Women without cardiovascular disease with the highest levels of dietary TAC had a statistically significant 17 percent lower risk of total stroke compared to those in the lowest quintile. 
  • Women with a history of cardiovascular disease in the highest three quartiles of dietary TAC had a statistically significant 46 percent to 57 percent lower risk of hemorrhagic stroke compared with those in the lowest quartile. 
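As a generic sketch of the binning behind those comparisons (illustrative only — the study's actual analysis also adjusted for confounders such as smoking and education), participants can be ranked by TAC score and split into quintiles or quartiles:

```python
def quantile_groups(tac_scores, n=5):
    """Rank values and split them into n near-equal-sized groups (1 = lowest).

    Generic quintile/quartile binning sketch; scores are hypothetical.
    """
    order = sorted(range(len(tac_scores)), key=lambda i: tac_scores[i])
    groups = [0] * len(tac_scores)
    for rank, i in enumerate(order):
        groups[i] = rank * n // len(tac_scores) + 1
    return groups

# Ten hypothetical dietary TAC scores (arbitrary units)
scores = [12.1, 8.4, 15.0, 9.9, 11.2, 14.3, 7.7, 13.8, 10.5, 16.2]
print(quantile_groups(scores))       # quintiles: two participants per group
print(quantile_groups(scores, n=4))  # quartiles, as used for the CVD group
```

Risk in each upper group is then compared against the lowest group, which serves as the reference category.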

"Women with a high antioxidant intake may be more health conscious and have the sort of healthy behaviors that may have influenced our results," Rautiainen said. "However, the observed inverse association between dietary TAC and stroke persisted after adjustments for potential confounders related to healthy behavior such as smoking, physical activity and education." 

For the study, researchers used the Swedish Mammography Cohort to identify 31,035 heart disease-free women and 5,680 women with a history of heart disease in two counties. The women were 49-83 years old. 

Researchers tracked the cardiovascular disease-free women an average 11.5 years and the women with cardiovascular disease 9.6 years, from September 1997 through the date of first stroke, death or Dec. 31, 2009, whichever came first. 

Researchers identified 1,322 strokes among cardiovascular disease-free women and 1,007 strokes among women with a history of cardiovascular disease from the Swedish Hospital Discharge Registry. 

"To the best of our knowledge, no study has assessed the relation between dietary TAC and stroke risk in participants with a previous history of cardiovascular disease," Rautiainen said. "Further studies are needed to assess the link between dietary TAC and stroke risk in men and in people in other countries, but we think our results are applicable." 

Source: American Heart Association [December 01, 2011]

12/01/2011

Heart attack risk differs between men and women


Findings on coronary CT angiography (CTA), a noninvasive test to assess the coronary arteries for blockages, show different risk scenarios for men and women, according to a study presented today at the Radiological Society of North America (RSNA). 


Coronary artery disease (CAD) is a narrowing of the blood vessels that supply blood and oxygen to the heart. It is caused by a build-up of fat and other substances that form plaque on vessel walls. According to the Centers for Disease Control and Prevention, heart disease is the leading cause of death for both men and women in the U.S. 

Researchers at the Medical University of South Carolina analyzed the results of coronary CTA on 480 patients, mean age 55, with acute chest pain. Approximately 65 percent of the patients were women, and 35 percent were men. The possibility of acute coronary syndrome was ruled out for each of the patients. 

Using coronary CTA, the researchers were able to determine the number of vessel segments with plaque, the severity of the blockage and the composition of the plaque. 

"The latest CT scanners are able to produce images that allow us to determine whether the plaque is calcified, non-calcified or mixed," said John W. Nance Jr., M.D., currently a radiology resident at Johns Hopkins Hospital in Baltimore, Md. 

By comparing the coronary CTA results with outcome data over a 12.8-month follow-up period, the researchers were able to correlate the extent, severity and type of plaque build-up with the occurrence of major adverse cardiac events, such as a heart attack or coronary bypass surgery. The statistical analysis tested all plaques combined (calcified, non-calcified and mixed) and each individual plaque type separately. 

"We found that the risks for cardiovascular events associated with plaque were significantly different between women and men," Dr. Nance said. 

Within the follow-up period, 70 of the patients experienced at least one major adverse cardiac event, such as death, heart attack, unstable angina or revascularization; in total, 87 major adverse cardiac events occurred among the patients during the follow-up period. 

When the outcome data were correlated with the CTA combined plaque findings, the results indicated that women with a large amount of plaque build-up and extensive atherosclerosis are at significantly greater cardiovascular risk than men. 

Specifically, the risk for major adverse cardiac events was significantly higher in women than in men when extensive plaque of any kind was present or when more than four artery segments were narrowed. 

"This research tells us that extensive coronary plaque is more worrisome in women than the equivalent amount in men," Dr. Nance said. 

However, when analyzing risk factors associated with the presence of individual types of plaque, the risk for major adverse cardiac events was greater in men, compared to women, when their artery segments contained non-calcified plaque. 

Dr. Nance said the new data suggested that the atherosclerotic process, or hardening of the arteries, is not necessarily linear and that more research is needed to better understand the disease. 

"Our research confirms that coronary CTA provides excellent prognostic information that helps identify risk, but there are gender differences that need to be considered," Dr. Nance said.  

Source: Radiological Society of North America [November 30, 2011]

11/30/2011

Violent video games alter brain function in young men


A functional magnetic resonance imaging (fMRI) analysis of long-term effects of violent video game play on the brain has found changes in brain regions associated with cognitive function and emotional control in young adult men after one week of game play. The results of the study were presented today at the annual meeting of the Radiological Society of North America (RSNA). 


The controversy over whether or not violent video games are potentially harmful to users has raged for many years, making it as far as the Supreme Court in 2010. But there has been little scientific evidence demonstrating that the games have a prolonged negative neurological effect. 

"For the first time, we have found that a sample of randomly assigned young adults showed less activation in certain frontal brain regions following a week of playing violent video games at home," said Yang Wang, M.D., assistant research professor in the Department of Radiology and Imaging Sciences at Indiana University School of Medicine in Indianapolis. "These brain regions are important for controlling emotion and aggressive behavior." 

For the study, 22 healthy adult males, age 18 to 29, with low past exposure to violent video games were randomly assigned to two groups of 11. Members of the first group were instructed to play a shooting video game for 10 hours at home for one week and refrain from playing the following week. The second group did not play a violent video game at all during the two-week period. 

Each of the 22 men underwent fMRI at the beginning of the study, with follow-up exams at one and two weeks. During fMRI, the participants completed an emotional interference task, pressing buttons according to the color of visually presented words. Words indicating violent actions were interspersed among nonviolent action words. In addition, the participants completed a cognitive inhibition counting task. 

The results showed that after one week of violent game play, the video game group members showed less activation in the left inferior frontal lobe during the emotional task and less activation in the anterior cingulate cortex during the counting task, compared to their baseline results and the results of the control group after one week. After the second week without game play, the changes to the executive regions of the brain were diminished. 

"These findings indicate that violent video game play has a long-term effect on brain functioning," Dr. Wang said. 

Source: Radiological Society of North America [November 30, 2011]

Eating fish reduces risk of Alzheimer's disease


People who eat baked or broiled fish on a weekly basis may be improving their brain health and reducing their risk of developing mild cognitive impairment (MCI) and Alzheimer's disease, according to a study presented today at the annual meeting of the Radiological Society of North America (RSNA). 


"This is the first study to establish a direct relationship between fish consumption, brain structure and Alzheimer's risk," said Cyrus Raji, M.D., Ph.D., from the University of Pittsburgh Medical Center and the University of Pittsburgh School of Medicine. "The results showed that people who consumed baked or broiled fish at least one time per week had better preservation of gray matter volume on MRI in brain areas at risk for Alzheimer's disease." 

Alzheimer's disease is an incurable, progressive brain disease that slowly destroys memory and cognitive skills. According to the National Institute on Aging, as many as 5.1 million Americans may have Alzheimer's disease. In MCI, memory loss is present but to a lesser extent than in Alzheimer's disease. People with MCI often go on to develop Alzheimer's disease. 

For the study, 260 cognitively normal individuals were selected from the Cardiovascular Health Study. Information on fish consumption was gathered using the National Cancer Institute Food Frequency Questionnaire. There were 163 patients who consumed fish on a weekly basis, and the majority ate fish one to four times per week. Each patient underwent 3-D volumetric MRI of the brain. Voxel-based morphometry, a brain mapping technique that measures gray matter volume, was used to model the relationship between weekly fish consumption at baseline and brain structure 10 years later. The data were then analyzed to determine if gray matter volume preservation associated with fish consumption reduced risk for Alzheimer's disease. The study controlled for age, gender, education, race, obesity, physical activity, and the presence or absence of apolipoprotein E4 (ApoE4), a gene that increases the risk of developing Alzheimer's. 

Gray matter volume is crucial to brain health. When it remains higher, brain health is being maintained. Decreases in gray matter volume indicate that brain cells are shrinking. 

The findings showed that consumption of baked or broiled fish on a weekly basis was positively associated with gray matter volumes in several areas of the brain. Greater hippocampal, posterior cingulate and orbital frontal cortex volumes in relation to fish consumption reduced the risk for five-year decline to MCI or Alzheimer's by almost five-fold. 

"Consuming baked or broiled fish promotes stronger neurons in the brain's gray matter by making them larger and healthier," Dr. Raji said. "This simple lifestyle choice increases the brain's resistance to Alzheimer's disease and lowers risk for the disorder." 

The results also demonstrated increased levels of cognition in people who ate baked or broiled fish. 

"Working memory, which allows people to focus on tasks and commit information to short-term memory, is one of the most important cognitive domains," Dr. Raji said. "Working memory is destroyed by Alzheimer's disease. We found higher levels of working memory in people who ate baked or broiled fish on a weekly basis, even when accounting for other factors, such as education, age, gender and physical activity." 

Eating fried fish, on the other hand, was not shown to preserve brain volume or protect against cognitive decline. 

Source: Radiological Society of North America [November 30, 2011]

11/29/2011

Original Thinkers More Likely to Cheat, Study Finds


Creative people are more likely to cheat than less creative people, possibly because this talent increases their ability to rationalize their actions, according to research published by the American Psychological Association. 


"Greater creativity helps individuals solve difficult tasks across many domains, but creative sparks may lead individuals to take unethical routes when searching for solutions to problems and tasks," said lead researcher Francesca Gino, PhD, of Harvard University. 

Gino and her co-author, Dan Ariely, PhD, of Duke University, conducted a series of five experiments to test their thesis that more creative people would cheat under circumstances where they could justify their bad behavior. Their research was published online in APA's Journal of Personality and Social Psychology®. 

The researchers used a series of recognized psychological tests and measures to gauge research subjects' creativity. They also tested participants' intelligence. In each of the five experiments, participants received a small sum for showing up. Then, they were presented with tasks or tests where they could be paid more if they cheated. For example, in one experiment, participants took a general knowledge quiz in which they circled their answers on the test paper. Afterward, the experimenter told them to transfer their answers to "bubble sheets" -- but the experimenter told the group she had photocopied the wrong sheet and that the correct answers were lightly marked. The experimenters also told participants they would be paid more for more correct answers and led them to believe that they could cheat without detection when transferring their answers. However, all the papers had unique identifiers. 

The results showed the more creative participants were significantly more likely to cheat, and that there was no link between intelligence and dishonesty -- i.e., more intelligent but less creative people were not more inclined toward dishonesty. 

In another experiment, test subjects were shown drawings with dots on two sides of a diagonal line and asked to indicate whether there were more dots on the left side or the right side. In half of the 200 trials, it was virtually impossible to tell whether there were more dots on one side or the other. However, participants were told they'd be paid 10 times as much (5 cents vs. 0.5 cents) each time they said there were more dots on the right side. As predicted, the more creative participants were significantly more likely to give the answer that paid more. 

"Dishonesty and innovation are two of the topics most widely written about in the popular press," the authors wrote. "Yet, to date, the relationship between creativity and dishonest behavior has not been studied empirically. … The results from the current article indicate that, in fact, people who are creative or work in environments that promote creative thinking may be the most at risk when they face ethical dilemmas." 

The authors concede some important limitations in their work, most notably that they created situations in which participants were tempted by money to cheat. They suggested that future research should investigate whether creativity would lead people to satisfy selfish, short-term goals rather than their higher aspirations when faced with self-control dilemmas, such as eating a slice of cake when trying to lose weight. 

Source: American Psychological Association [November 28, 2011]

Study Looks at the Nature of Change in Our Aging, Changing Brains


As we get older, our cognitive abilities change, improving when we're younger and declining as we age. Scientists posit a hierarchical structure within which these abilities are organized. There's the "lowest" level -- measured by specific tests, such as story memory or word memory; the second level, which groups various skills involved in a category of cognitive ability, such as memory, perceptual speed, or reasoning; and finally, the "general," or G, factor, a sort of statistical aggregate of all the thinking abilities. 


What happens to this structure as we age? That was the question Timothy A. Salthouse, Brown-Forman professor of psychology at the University of Virginia, investigated in a new study appearing in an upcoming issue of Psychological Science, a journal published by the Association for Psychological Science. His findings advance psychologists' understanding of the complexities of the aging brain. 

"There are three hypotheses about how this works," says Salthouse. "One is that abilities become more strongly integrated with one another as we age." That theory suggests the general factor influences cognitive aging the most. The second -- based on the idea that connectivity among different brain regions lessens with age -- "is almost the opposite: that the changes in cognitive abilities become more rather than less independent with age." The third was Salthouse's hypothesis: The structure remains constant throughout the aging process. 

Using a sample of 1,490 healthy adults ages 18 to 89, Salthouse performed analyses of the scores on 16 tests of five cognitive abilities -- vocabulary, reasoning, spatial relations, memory, and perceptual speed. The primary analyses were on the changes in the test scores across an interval of about two and a half years. 

The findings confirmed Salthouse's hunch: "The effects of aging on memory, on reasoning, on spatial relations, and so on are not necessarily constant. But the structure within which these changes are occurring does not seem to change as a function of age." In normal, healthy people, "the direction and magnitude of change may be different" when we're 18 or 88, he says. "But it appears that the qualitative nature of cognitive change remains the same throughout adulthood." 

The study could inform other research investigating "what allows some people to age more gracefully than others," says Salthouse. That is, do people who stay mentally sharper maintain their ability structures better than those who become more forgetful or less agile at reasoning? And in the future, applying what we know about the structures of change could enhance "interventions that we think will improve cognitive functioning" at any age or stage of life. 

Source: Association for Psychological Science [November 22, 2011]

The ethics of smart drugs


Professor Barbara Sahakian, Professor of Clinical Neuropsychology at the University of Cambridge, has been researching cognitive enhancers for over a decade.  Here she discusses the emergence of ‘smart drugs’ and the ethical and practical issues they raise.  


There is an increasing lifestyle use of cognitive-enhancing drugs, or 'smart drugs', by healthy people. Why might this be? And how will it change our society? Are people using these drugs just to realize their potential, or is it that the pressure to perform in a globally competitive environment means individuals feel they cannot afford an 'off day' due to lack of sleep or stress? This is perhaps particularly true of certain professions where there are issues of safety to oneself or others, such as the military or medicine. 

Caffeine is the current stimulant of choice for many people, as it is widely available and effective; however, its wakefulness-promoting effects are transient, and at the dose required for maximum effect (600 mg), tremor is a common and, for doctors, undesirable side-effect. It is therefore worth examining whether there are more effective cognitive enhancers, with fewer detrimental side effects, for those whose failures of attention, concentration and problem solving may have deleterious consequences, including jeopardy to safety in the military arena or serious adverse events during operations. 

Although measures to reduce doctors' working hours have been implemented in both the United States and Europe, surgeons performing long, arduous operations remain susceptible to the effects of fatigue, and frequent transitions from day to night work expose junior doctors to the risk of impaired psychomotor performance. Indeed, fatigued doctors risk making poor judgements and committing serious medical errors. Given the continued need for innovation in this area, pharmacological methods could conceivably be used to combat fatigue at some time in the future. 

In an exciting collaboration between the University of Cambridge, Department of Psychiatry and the Imperial College London, Division of Surgery, it has recently been discovered that the ‘smart drug’ Modafinil improves cognitive flexibility and reduces impulsivity in sleep deprived doctors. These results have just been published in the journal Annals of Surgery. 

In a proof of concept randomized placebo controlled study, run by Charlotte Housden (Cambridge) and Dr Colin Sugden (Imperial), 39 doctors were deprived of sleep overnight and given a dose of 200mg of Modafinil or placebo. The doctors taking Modafinil had cognitive improvements, including flexibility of thinking, and reduced impulsivity. These executive functions are clearly important for conducting surgical operations under stress and time pressure. However, there was no change on their clinical psychomotor performance on a laparoscopic task, which mimics and measures the dexterity necessary to perform surgery. 

While a chronic Modafinil study is required to determine the long-term effects of the drug as a safe and effective means of countering cognitive impairment due to sleep deprivation, this acute study demonstrates that benefits are evident on at least the first occasion of use. 

Given these important findings, it is possible to speculate that doctors who take these drugs may be able to plan an intervention more effectively or show greater cognitive flexibility when approaching a challenging clinical problem. It is critical to note, however, that the long-term safety of these drugs in healthy people remains to be determined. 

The many ethical discussions that I and Charlotte Housden have had with the public on the use of cognitive enhancing drugs by healthy people have been revealing.  A variety of views have emerged, ranging from ‘These drugs should only be used by people with neuropsychiatric disorders such as Attention Deficit Hyperactivity disorder, or Alzheimer’s Disease’, to ‘If they are safe, why not use them, to make up for fatigue, to improve memory or other forms of cognition?’, and ‘Why not use them to get in an especially productive working day?’. 

This increasing lifestyle use has to be balanced against other important facets of life, such as a good work/life balance. The possibility that we will accelerate into a 24/7 society is a serious concern for many people, as are issues of cheating and coercion. As a society, we certainly need to be concerned about the use of these drugs by healthy children and adolescents, whose brains are still in development. Furthermore, purchasing prescription medication over the internet is dangerous. However, if long-term safety and efficacy are proven in healthy people, it may well be that, at least for certain segments of the population, these drugs will prove life-savers. 

Source: University of Cambridge [October 31, 2011]

11/28/2011

Denying mental qualities to animals in order to eat them

New research by Dr Brock Bastian from UQ's School of Psychology highlights the psychological processes that people engage in to reduce their discomfort over eating meat. 


This paper will be published in an upcoming edition of the Personality and Social Psychology Bulletin, where Dr Bastian and his co-authors show that people deny mental qualities to animals they eat. 

"Many people like eating meat, but most are reluctant to harm things that have minds. Our studies show that this motivates people to deny minds to animals," Dr Bastian said. 

The research demonstrates that when people are confronted with the harm their meat-eating brings to food animals, they view those animals as possessing fewer mental capacities than when they are not so reminded. 

The findings also reveal that this denial of mind to food animals is especially evident when people expect to eat meat in the near future. 

Dr Bastian said it shows that denying mind to animals that are used for food makes it less troublesome for people to eat them. 

"Meat is central to most people's diets and a focus of culinary enjoyment, yet most people also like animals and are disturbed by harm done to them; therefore creating a 'meat paradox' - people's concern for animal welfare conflicts with their culinary behavior. 

"For this reason, people rarely enjoy thinking about where meat comes from, the processes it goes through to get to their tables, or the living qualities of the animals from which it is extracted," he said. 

Dr Bastian's research argues that meat eaters go to great lengths to overcome these inconsistencies between their beliefs and behaviours. 

"In our current research we focus on the processes by which people facilitate their practice of eating meat. People often mentally separate meat from animals so they can eat pork or beef without thinking about pigs or cows. 

"Denying minds to animals reduces concern for their welfare, justifying the harm caused to them in the process of meat production," he added. 

Meat is pleasing to the palate for many, and although the vegetarian lifestyle is increasingly popular, most people continue to make meat a central component of their diet. 

"In short, our work highlights the fact that although most people do not mind eating meat, they do not like thinking of animals they eat as having possessed minds," Dr Bastian said.  

Source: University of Queensland [November 25, 2011]

11/17/2011

Today's teens will die younger of heart disease


A new study that takes a complete snapshot of adolescent cardiovascular health in the United States reveals a dismal picture of teens who are likely to die of heart disease at a younger age than adults do today, according to Northwestern Medicine research. 


"We are all born with ideal cardiovascular health, but right now we are looking at the loss of that health in youth," said Donald Lloyd-Jones, M.D., chair and associate professor of preventive medicine at Northwestern University Feinberg School of Medicine and a physician at Northwestern Memorial Hospital. "Their future is bleak." 

Lloyd-Jones is the senior investigator of the study presented Nov. 16 at the American Heart Association Scientific Sessions in Orlando. 

The effect of this worsening teen health is already being seen in young adults. For the first time, there is an increase in cardiovascular mortality rates in younger adults ages 35 to 44, particularly in women, Lloyd-Jones said. 

The alarming health profiles of 5,547 children and adolescents, ages 12 to 19, reveal many have high blood sugar levels, are obese or overweight, have a lousy diet, don't get enough physical activity and even smoke, the new study reports. These youth are a representative sample of 33.1 million U.S. children and adolescents from the 2003 to 2008 National Health and Nutrition Examination Surveys. 

"Cardiovascular disease is a lifelong process," Lloyd-Jones said. "The plaques that kill us in our 40s and 50s start to form in adolescence and young adulthood. These risk factors really matter." 

"After four decades of declining deaths from heart disease, we are starting to lose the battle again," Lloyd-Jones added. 

The American Heart Association (AHA) defines ideal cardiovascular health as having optimum levels of seven well-established cardiovascular risk factors, noted lead study author Christina Shay, who did the research while she was a postdoctoral fellow in preventive medicine at Northwestern's Feinberg School. Shay now is an assistant professor of epidemiology at the University of Oklahoma Health Sciences Center. 

"What was most alarming about the findings of this study is that zero children or adolescents surveyed met the criteria for ideal cardiovascular health," Shay said. "These data indicate ideal cardiovascular health is being lost as early as, if not earlier than the teenage years." 

The study used measurements from the AHA's 2020 Strategic Impact Goals for monitoring cardiovascular health in adolescents and children. Among the findings: 

Terrible Diets

All the 12-to-19-year-olds had terrible diets, which, surprisingly, were even worse than those of adults, Lloyd-Jones said. None of their diets met all five criteria for being healthy. Their diets were high in sodium and sugar-sweetened beverages and didn't include enough fruits, vegetables, fiber or lean protein. 

"They are eating too much pizza and not enough whole foods prepared inside the home, which is why their sodium is so high and fruit and vegetable content is so low," Lloyd-Jones said. 

High Blood Sugar

More than 30 percent of boys and more than 40 percent of girls have elevated blood sugar, putting them at high risk for developing type 2 diabetes. 

Overweight or Obese

Thirty-five percent of boys and girls are overweight or obese. "These are startling rates of overweight and obesity, and we know it worsens with age," Lloyd-Jones said. "They are off to a bad start." 


Low Physical Activity

Approximately 38 percent of girls had an ideal physical activity level compared to 52 percent of boys. 

High Cholesterol

Girls' cholesterol levels were worse than boys'. Only 65 percent of girls met the ideal level compared to 73 percent of boys. 

Smoking

Almost 25 percent of teens had smoked within the month before being surveyed. 

Blood Pressure

Most boys and girls (92.9 percent and 93.4 percent, respectively) had an ideal level of blood pressure. 

The problem won't be easy to fix. "We are much more sedentary and get less physical activity in our daily lives," Lloyd-Jones said. "We eat more processed food, and we get less sleep. It's a cultural phenomenon, and the many pressures on our health are moving in a bad direction. This is a big societal problem we must address." 
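The AHA's "ideal cardiovascular health" construct is, in effect, a seven-item checklist on which every metric must be at its ideal level. A minimal sketch of how such an all-or-nothing score might be tallied (metric names from the article; the pass/fail flags for the example teen are purely hypothetical):

```python
# The seven well-established cardiovascular health factors named above.
ideal_metrics = ["diet", "physical_activity", "smoking_status",
                 "body_mass_index", "blood_pressure", "cholesterol",
                 "blood_glucose"]

def meets_ideal_health(flags):
    """Ideal cardiovascular health requires ALL seven metrics at ideal levels."""
    return all(flags[m] for m in ideal_metrics)

# Hypothetical teen: every metric ideal except diet -- consistent with the
# finding that no surveyed adolescent's diet met all the criteria.
example_teen = {"diet": False, "physical_activity": True,
                "smoking_status": True, "body_mass_index": True,
                "blood_pressure": True, "cholesterol": True,
                "blood_glucose": True}

print(meets_ideal_health(example_teen))  # False: one failing metric is enough
```

Because the diet criterion alone failed for every participant, zero of the 5,547 adolescents could meet the overall definition, as Shay noted.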

Author: Marla Paul | Source: Northwestern University [November 16, 2011]
